Related resources
Multi-class AdaBoost
Boosting has been a very successful technique for solving the two-class classification problem. In going from two-class to multi-class classification, most algorithms have been restricted to reducing the multi-class classification problem to multiple two-class problems. In this paper, we develop a new algorithm that directly extends the AdaBoost algorithm to the multi-class case without reducin...
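The direct multi-class extension this abstract describes is commonly written in the SAMME form, where the weak learner's vote gains an extra log(K-1) term and the learner only has to beat random guessing over K classes. Below is a minimal NumPy sketch of that form, assuming decision stumps as weak learners and integer labels 0..K-1; these choices, and the stopping rule, are illustrative assumptions rather than details taken from the paper.

```python
# Minimal sketch of a SAMME-style multi-class AdaBoost (illustrative only;
# weak learner, stopping rule and label encoding are assumptions, not from the paper).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def samme_fit(X, y, n_rounds=50):
    # assumes integer class labels 0..K-1
    n = len(y)
    K = len(np.unique(y))
    w = np.full(n, 1.0 / n)               # uniform example weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)   # weak learner trained on weighted data
        miss = stump.predict(X) != y
        err = np.dot(w, miss) / w.sum()
        if err >= 1.0 - 1.0 / K:           # no better than random guessing over K classes
            break
        # multi-class vote: log((1-err)/err) + log(K-1); reduces to binary AdaBoost when K = 2
        alpha = np.log((1.0 - err) / max(err, 1e-12)) + np.log(K - 1.0)
        w *= np.exp(alpha * miss)          # up-weight misclassified examples
        w /= w.sum()                       # renormalize
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas, K

def samme_predict(X, learners, alphas, K):
    votes = np.zeros((len(X), K))
    for h, a in zip(learners, alphas):
        votes[np.arange(len(X)), h.predict(X)] += a   # weighted vote for the predicted class
    return votes.argmax(axis=1)
```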
A robust multi-class AdaBoost algorithm for mislabeled noisy data
AdaBoost has been theoretically and empirically proved to be a very successful ensemble learning algorithm, which iteratively generates a set of diverse weak learners and combines their outputs using the weighted majority voting rule as the final decision. However, in some cases, AdaBoost leads to overfitting especially for mislabeled noisy training examples, resulting in both its degraded gene...
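The overfitting on mislabeled examples mentioned here follows from the multiplicative reweighting in standard AdaBoost: a point the weak learners keep getting wrong is up-weighted every round and eventually dominates training. A toy sketch of that effect is below; it uses the plain binary update, not the robust variant proposed in the paper, and the sample size, error pattern, and round count are made up for illustration.

```python
# Toy illustration (not the paper's robust algorithm): an example that is repeatedly
# misclassified, e.g. because its label is flipped, accumulates weight round after round.
import numpy as np

rng = np.random.default_rng(0)
n = 200
w = np.full(n, 1.0 / n)          # uniform initial weights
noisy = 0                        # index of the mislabeled example (assumed)

for t in range(8):
    # assume each weak learner misses the noisy point plus ~15% of the clean points
    miss = np.zeros(n, dtype=bool)
    miss[noisy] = True
    miss[rng.choice(np.arange(1, n), size=30, replace=False)] = True
    err = w[miss].sum()                          # weighted training error
    alpha = 0.5 * np.log((1 - err) / err)        # this round's vote
    w *= np.exp(np.where(miss, alpha, -alpha))   # up-weight misses, down-weight hits
    w /= w.sum()
    print(f"round {t+1}: noisy-example weight = {w[noisy]:.3f} (uniform would be {1/n:.3f})")
```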
Solving Two-class Classification Problem Using Adaboost
This paper presents a learning algorithm based on AdaBoost for solving the two-class classification problem. The concept of boosting is to combine several weak learners to form a highly accurate strong classifier. AdaBoost is fast and simple because it focuses on finding weak learning algorithms that only need to be better than random, instead of designing an algorithm that learns deliberately over...
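A minimal way to try this recipe is scikit-learn's off-the-shelf AdaBoostClassifier with depth-1 decision trees (stumps) as the "better than random" weak learners; the synthetic dataset and hyper-parameters below are placeholders, not settings from the paper.

```python
# Two-class AdaBoost with decision stumps as weak learners (illustrative settings).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Each stump is only slightly better than random; boosting combines 100 of them.
clf = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=1),  # `base_estimator=` on scikit-learn < 1.2
    n_estimators=100,
    random_state=0,
)
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```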
Predicting protein structural class with AdaBoost Learner.
The structural class is an important feature in characterizing the overall topological folding type of a protein or the domains therein. Prediction of the protein structural class has attracted the attention and efforts of many investigators. In this paper, a novel predictor, the AdaBoost Learner, was introduced to deal with this problem. The essence of the AdaBoost Learner is that a comb...
Unifying multi-class AdaBoost algorithms with binary base learners under the margin framework
Multi-class AdaBoost algorithms AdaBoost.MO, -ECC and -OC have received great attention in the literature, but their relationships have not been fully examined to date. In this paper, we present a novel interpretation of the three algorithms by showing that MO and ECC perform stage-wise functional gradient descent on a cost function defined over margin values, and that OC is a shrinkage ver...
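For reference, the binary template behind this "cost over margins" reading is the additive model and exponential margin cost below; the multi-class, output-code cost functions analyzed in the paper generalize it, so the display is only the standard AdaBoost special case, not the paper's derivation.

```latex
% Stage-wise functional gradient descent on a margin cost (binary AdaBoost case).
% F is built additively; each stage adds the weak learner / step size pair that
% most decreases the cost of the margins y_i F(x_i).
\begin{aligned}
F_T(x) &= \sum_{t=1}^{T} \alpha_t h_t(x), \qquad h_t(x) \in \{-1,+1\},\\
C(F)   &= \sum_{i=1}^{n} \exp\!\bigl(-y_i F(x_i)\bigr)
          \quad\text{(cost over the margins } y_i F(x_i)\text{)},\\
(\alpha_t, h_t) &= \arg\min_{\alpha,\, h}\; C\bigl(F_{t-1} + \alpha h\bigr).
\end{aligned}
```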
Journal
Journal title: Statistics and Its Interface
Year: 2009
ISSN: 1938-7989, 1938-7997
DOI: 10.4310/sii.2009.v2.n3.a8